
    Review of LiDAR Sensor Data Acquisition and Compression for Automotive Applications

    Due to the specific dynamics of the operating environment and the required safety regulations, the amount of data acquired by an automotive LiDAR sensor that has to be processed reaches several Gbit/s. Data compression is therefore much needed to enable future multi-sensor automated vehicles. Numerous techniques have been developed to compress raw LiDAR data; however, these techniques primarily target compression of the 3D point cloud, while the way data is captured and transferred from the sensor to an electronic control unit (ECU) has been left out. The purpose of this paper is to discuss and evaluate how various low-level compression algorithms could be used in an automotive LiDAR sensor to optimize on-chip storage capacity and link bandwidth. We also discuss the relevant parameters that affect the amount of data collected per second and the associated issues. After analyzing compression approaches and identifying their limitations, we outline several promising directions for future research.
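    To make the bandwidth argument concrete, the sketch below illustrates one simple low-level, lossless scheme of the kind the review surveys: delta-encoding consecutive range samples so that small differences can later be packed into fewer bits. It is a minimal illustration under assumed data types (16-bit range samples in millimetres), not the specific method evaluated in the paper.

# Minimal sketch (illustrative only, not the paper's method): delta-encoding
# consecutive 16-bit range samples so that small differences can be stored
# or transmitted more compactly than the raw values.
import numpy as np

def delta_encode(ranges_mm: np.ndarray) -> np.ndarray:
    """Return the first sample followed by successive differences."""
    deltas = np.diff(ranges_mm.astype(np.int32))
    return np.concatenate(([ranges_mm[0]], deltas))

def delta_decode(encoded: np.ndarray) -> np.ndarray:
    """Invert delta_encode by cumulative summation."""
    return np.cumsum(encoded).astype(np.uint16)

if __name__ == "__main__":
    scan = np.array([12000, 12003, 12001, 11998, 15000], dtype=np.uint16)  # ranges in mm
    enc = delta_encode(scan)
    assert np.array_equal(delta_decode(enc), scan)
    print("original:", scan)
    print("encoded deltas:", enc)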

    Simulating Rain Droplets Influence on Distance Measurement with a Time-of-Flight Camera Sensor

    Time-of-Flight (ToF) camera sensors measure the light intensity and the scene distance simultaneously on a per-pixel basis. Environmental effects, such as rain droplets between the scene and the ToF camera, affect the distance accuracy of the sensor. Optical ray-tracing simulations were performed to study the influence of rain in detail. The 3D simulation setup comprises all relevant elements, including the sensor design, the object/scene geometry and a model of the environmental conditions. Specifically, a setup with small-angle ToF camera optics is investigated and the influence of several typical rain intensities is compared. The simulation results serve as an input for developing error-compensation algorithms.
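    For context on how a ToF camera turns per-pixel measurements into a distance, the sketch below shows the widely used four-phase ("4-bucket") demodulation for continuous-wave ToF sensors. The demodulation scheme and the 20 MHz modulation frequency are assumptions for illustration, not details taken from the paper; a droplet in front of the lens mixes near and far returns and thereby biases the recovered phase.

# Minimal sketch (assumed four-phase demodulation, not taken from the paper):
# recovering a per-pixel distance from four phase-shifted samples of a
# continuous-wave ToF camera.
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_distance(a0, a1, a2, a3, f_mod=20e6):
    """Estimate distance from samples at 0, 90, 180 and 270 degrees.

    a0..a3 may be scalars or per-pixel arrays; f_mod is the modulation
    frequency in Hz (20 MHz assumed here for illustration).
    """
    phase = np.arctan2(a3 - a1, a0 - a2)      # phase shift in [-pi, pi]
    phase = np.mod(phase, 2 * np.pi)          # wrap into [0, 2*pi)
    return C * phase / (4 * np.pi * f_mod)    # distance in metres

if __name__ == "__main__":
    # Ideal, droplet-free example values for one pixel.
    print(f"{tof_distance(1.0, 0.5, 0.2, 0.7):.3f} m")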

    Fog Effects on Time-of-Flight Imaging Investigated by Ray-Tracing Simulations

    Time-of-Flight (ToF) sensors are a key technology for autonomous vehicles and autonomous mobile robots. Quantifying the extent of the perturbation induced by atmospheric phenomena on ToF imaging is critical for identifying effective correction strategies. Here we present an approach that uses optical ray tracing to simulate the ToF image, while the distance information is recovered by analyzing the optical path of each ray. Such an approach allows, for example, understanding the effects of different ray paths on the ToF image, or testing various retrieval/correction algorithms on the output of a single ray-tracing simulation. By modelling several scattering scenarios, we show that ranging errors arise mostly from light backscattered to the sensor before reaching the scene. Scattering events close to the sensor (<1 m) have the largest influence; therefore, strategies that filter out signals from distances shorter than the range of interest can significantly improve the accuracy of ToF sensors.
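    The sketch below illustrates the near-range gating idea mentioned in the abstract: ray contributions whose optical path corresponds to a distance below a chosen minimum are discarded before the distance estimate is formed. The data layout and the simple intensity-weighted averaging are hypothetical simplifications (real phase-based ToF mixing is more involved), not the authors' implementation.

# Minimal sketch (hypothetical data layout and simplified averaging):
# discard ray contributions closer than a minimum range, i.e. the kind of
# near-range filtering that suppresses fog backscatter close to the sensor.
import numpy as np

def gated_distance(path_lengths_m, intensities, min_range_m=1.0):
    """Intensity-weighted mean distance, ignoring returns closer than min_range_m."""
    d = np.asarray(path_lengths_m, dtype=float)
    w = np.asarray(intensities, dtype=float)
    keep = d >= min_range_m              # drop near-range backscatter
    if not np.any(keep):
        return np.nan                    # nothing left after gating
    return np.average(d[keep], weights=w[keep])

if __name__ == "__main__":
    # One strong backscatter event at 0.3 m (fog near the lens) plus the
    # true scene return around 5 m.
    d = [0.3, 4.9, 5.0, 5.1]
    i = [0.8, 0.3, 0.5, 0.3]
    print(f"ungated: {np.average(d, weights=i):.2f} m")
    print(f"gated:   {gated_distance(d, i):.2f} m")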

    Investigating Intense Rainfall Influence on Distance Measurement with a Time-of-Flight Camera Sensor Using Optical Ray-Tracing Simulation Technique

    Robust, fast and reliable examination of the surroundings is essential for further advancements in autonomous driving and robotics. Time-of-Flight (ToF) camera sensors are a key technology for measuring surrounding objects and their distances on a per-pixel basis in real time. Environmental effects, such as rain in front of the sensor, can degrade the distance accuracy of the sensor. Here we use an optical ray-tracing procedure to examine the effect of rain on the ToF image. Simulation results are presented for experimental rain droplet distributions characteristic of intense rainfall at rates of 25 mm/h and 100 mm/h. The ray-tracing simulation data and results serve as an input for developing and testing rain-signal suppression strategies.
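    As a rough indication of how droplet populations for given rain rates could be generated for such a ray-tracing scene, the sketch below samples droplet diameters from the classic Marshall-Palmer exponential distribution. This is a stand-in assumption: the paper uses experimentally measured droplet distributions, not this parametric model.

# Minimal sketch (assumption: Marshall-Palmer model as a stand-in for the
# experimental droplet distributions used in the paper): sample droplet
# diameters for rain rates of 25 mm/h and 100 mm/h.
import numpy as np

N0 = 8000.0  # Marshall-Palmer intercept, drops per m^3 per mm of diameter

def marshall_palmer_lambda(rain_rate_mm_h: float) -> float:
    """Slope parameter (1/mm) of the exponential drop-size distribution."""
    return 4.1 * rain_rate_mm_h ** -0.21

def sample_droplet_diameters(rain_rate_mm_h, n, seed=0):
    """Draw n droplet diameters (mm) from the exponential distribution."""
    rng = np.random.default_rng(seed)
    lam = marshall_palmer_lambda(rain_rate_mm_h)
    return rng.exponential(scale=1.0 / lam, size=n)

if __name__ == "__main__":
    for rate in (25.0, 100.0):
        diam = sample_droplet_diameters(rate, 10_000)
        print(f"{rate:5.1f} mm/h -> mean droplet diameter {diam.mean():.2f} mm")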